Endpoint Post Dynamic Endpoint
Name
post_[dynamic_endpoint_name]
Description
Related Tables
Parameters
| Parameter | Required | Options |
|---|---|---|
| There are no parameters | | |
Output Columns
| Label | Data Type (SSIS) | Data Type (SQL) | Length | Description |
|---|---|---|---|---|
| index | DT_I8 | bigint | | |
| reason | DT_WSTR | nvarchar(255) | 255 | |
| location | DT_WSTR | nvarchar(255) | 255 | |
| debugInfo | DT_WSTR | nvarchar(4000) | 4000 | |
| message | DT_WSTR | nvarchar(4000) | 4000 | |
Input Columns
| Label | Data Type (SSIS) | Data Type (SQL) | Length | Description |
|---|---|---|---|---|
| -Dynamic- | DT_WSTR | nvarchar(4000) | 4000 | |
| [Dynamic Column]_DT | DT_DBTIMESTAMP | datetime | | Alias for any TIMESTAMP field; provides an alternate way to map a DateTime value to a UNIX timestamp. BigQuery accepts either a Unix timestamp or an ISO date string as input for INSERT/UPDATE operations on a TIMESTAMP field. |
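Because BigQuery accepts a TIMESTAMP value either as an ISO datetime string or as a Unix epoch number, the `_DT` alias simply exposes a datetime-typed view of the same column. A minimal Python sketch (illustrative only, not part of the connector) of converting between the two representations:

```python
from datetime import datetime, timezone

def to_unix_epoch(dt: datetime) -> float:
    # Numeric form accepted by a BigQuery TIMESTAMP column;
    # the fractional part carries sub-second precision.
    return dt.timestamp()

def from_unix_epoch(epoch: float) -> datetime:
    # Reverse mapping, useful when reading TIMESTAMP values back.
    return datetime.fromtimestamp(epoch, tz=timezone.utc)

dt = datetime(2025, 1, 31, 23, 59, 59, tzinfo=timezone.utc)
print(to_unix_epoch(dt))                        # 1738367999.0
print(from_unix_epoch(1738367999).isoformat())  # 2025-01-31T23:59:59+00:00
```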
Examples
SSIS
Use the Google BigQuery Connector in the API Source or API Destination SSIS Data Flow component to read or write data.
API Destination
This endpoint belongs to the [Dynamic Table] table, so it is better to use that table instead of accessing the endpoint directly. Use this table and table-operation pair to post to the dynamic endpoint:
There are no parameters to configure.
ODBC application
Use these SQL queries in your ODBC application data source:
INSERT datetime / Unix timestamp values (Native SQL)
**Native SQL**: the SQL runs on the BigQuery side (server-side). Prefix the query with `#DirectSQL` to use this mode. The example shows how to insert datetime or Unix timestamp values into a TIMESTAMP column (datetime strings or Unix epoch, with or without milliseconds). Use the `_DT` alias on the column if your ETL tool expects a datetime type.
#DirectSQL
--DateTime Without milliseconds
INSERT INTO TestDataset.DataTypeTest (ColInteger,ColString,ColTimestamp) VALUES(1004,'My date time','2025-01-31T23:59:59');
--DateTime with milliseconds
INSERT INTO TestDataset.DataTypeTest (ColInteger,ColString,ColTimestamp) VALUES(1004,'My date time','2025-01-31T23:59:59.1234');
--DateTime alias field (e.g. {your-timestamp-column}_DT) - Useful if ETL tool needs DateTime datatype rather than Numeric Unix Epoch
INSERT INTO TestDataset.DataTypeTest (ColInteger,ColString,ColTimestamp_DT) VALUES(1004,'My date time','2025-01-31T23:59:59');
--Unix Epoch format (without milliseconds)
INSERT INTO TestDataset.DataTypeTest (ColInteger,ColString,ColTimestamp) VALUES(1006,'My date time',1738367999);
--Unix Epoch format (with milliseconds)
INSERT INTO TestDataset.DataTypeTest (ColInteger,ColString,ColTimestamp) VALUES(1006,'My date time',1738367999.12345);
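The epoch values above encode the same instant as the ISO strings in the earlier statements. A quick Python check (illustrative only, assuming UTC) that the documented values line up:

```python
from datetime import datetime, timezone

# Whole-second epoch from the example above
print(datetime.fromtimestamp(1738367999, tz=timezone.utc).isoformat())
# -> 2025-01-31T23:59:59+00:00

# Epoch with fractional seconds (milliseconds survive the conversion)
print(datetime.fromtimestamp(1738367999.12345, tz=timezone.utc).isoformat())
# -> 2025-01-31T23:59:59.123450+00:00
```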
INSERT single row
Inserts a single row using ZappySys query syntax. You can optionally append `WITH(DatasetId='...', ProjectId='...', Output='*')`.
INSERT INTO MyBQTable1(SomeBQCol1, SomeBQCol2) VALUES(1,'AAA')
--WITH(DatasetId='TestDataset',Output='*')
--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')
INSERT multiple rows from SQL Server
Bulk insert into a BigQuery table from Microsoft SQL Server. The column list is taken from the source query; use SQL aliases so that column names match the target table.
INSERT INTO MyBQTable1
SOURCE(
'MSSQL'
, 'Data Source=localhost;Initial Catalog=tempdb;Integrated Security=true'
, 'SELECT Col1 AS SomeBQCol1, Col2 AS SomeBQCol2 FROM SomeTable WHERE SomeCol=123'
)
--WITH(DatasetId='TestDataset',Output='*')
--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')
INSERT multiple rows from any ODBC source (DSN)
Bulk insert into a BigQuery table from any ODBC source (DSN). The column list comes from the source query; use aliases so that names match the target table.
INSERT INTO MyBQTable1
SOURCE(
'ODBC'
, 'DSN=MyDsn'
, 'SELECT Col1 AS SomeBQCol1, Col2 AS SomeBQCol2 FROM SomeTable WHERE SomeCol=123'
)
WITH(DatasetId='TestDataset')
INSERT multiple rows from JSON files / API
Bulk insert into a BigQuery table from JSON (URL or file). The column list comes from the source query; use aliases to match the target table. The example uses an ODBC connection to a JSON source (e.g. the ZappySys JSON ODBC driver); the same approach works with CSV or XML sources.
INSERT INTO MyBQTable1
SOURCE(
'ODBC'
, 'Driver={ZappySys JSON Driver};Src=''https://some-url/get-data'''
, 'SELECT Col1 AS SomeBQCol1, Col2 AS SomeBQCol2 FROM _root_'
)
--WITH(DatasetId='TestDataset',Output='*')
--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')
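Conceptually, the JSON-driver pattern above flattens rows from a JSON document and inserts them into the target table. A rough Python sketch of that flattening step (the table and column names are the placeholders from the example; this bypasses the driver entirely and is for illustration only):

```python
def rows_to_insert(table, rows):
    # Build one multi-row INSERT from a list of flat JSON objects.
    # Numbers pass through; strings get single quotes doubled, SQL-style.
    cols = list(rows[0])
    def lit(v):
        if isinstance(v, (int, float)):
            return str(v)
        return "'" + str(v).replace("'", "''") + "'"
    values = ",".join("(" + ",".join(lit(r[c]) for c in cols) + ")" for r in rows)
    return "INSERT INTO " + table + "(" + ",".join(cols) + ") VALUES" + values

data = [{"SomeBQCol1": 1, "SomeBQCol2": "AAA"},
        {"SomeBQCol1": 2, "SomeBQCol2": "BBB"}]
print(rows_to_insert("MyBQTable1", data))
# INSERT INTO MyBQTable1(SomeBQCol1,SomeBQCol2) VALUES(1,'AAA'),(2,'BBB')
```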
The post_[dynamic_endpoint_name] endpoint belongs to the [Dynamic Table] table(s) and can therefore be used via those table(s).
SQL Server
Use these SQL queries in SQL Server after you create a data source in Data Gateway:
INSERT datetime / Unix timestamp values (Native SQL)
**Native SQL**: the SQL runs on the BigQuery side (server-side). Prefix the query with `#DirectSQL` to use this mode. The example shows how to insert datetime or Unix timestamp values into a TIMESTAMP column (datetime strings or Unix epoch, with or without milliseconds). Use the `_DT` alias on the column if your ETL tool expects a datetime type.
DECLARE @MyQuery NVARCHAR(MAX) = '#DirectSQL
--DateTime Without milliseconds
INSERT INTO TestDataset.DataTypeTest (ColInteger,ColString,ColTimestamp) VALUES(1004,''My date time'',''2025-01-31T23:59:59'');
--DateTime with milliseconds
INSERT INTO TestDataset.DataTypeTest (ColInteger,ColString,ColTimestamp) VALUES(1004,''My date time'',''2025-01-31T23:59:59.1234'');
--DateTime alias field (e.g. {your-timestamp-column}_DT) - Useful if ETL tool needs DateTime datatype rather than Numeric Unix Epoch
INSERT INTO TestDataset.DataTypeTest (ColInteger,ColString,ColTimestamp_DT) VALUES(1004,''My date time'',''2025-01-31T23:59:59'');
--Unix Epoch format (without milliseconds)
INSERT INTO TestDataset.DataTypeTest (ColInteger,ColString,ColTimestamp) VALUES(1006,''My date time'',1738367999);
--Unix Epoch format (with milliseconds)
INSERT INTO TestDataset.DataTypeTest (ColInteger,ColString,ColTimestamp) VALUES(1006,''My date time'',1738367999.12345);';
EXEC (@MyQuery) AT [LS_TO_GOOGLE_BIGQUERY_IN_GATEWAY];
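Note that every single quote in the ODBC version of the query is doubled once the query is embedded in the NVARCHAR(MAX) literal passed to EXEC ... AT. A small Python helper (illustrative only) that performs this wrapping:

```python
def embed_in_tsql_literal(query, var="@MyQuery"):
    # Double embedded single quotes, then wrap the query in a DECLARE
    # statement suitable for EXEC (@MyQuery) AT [linked_server].
    escaped = query.replace("'", "''")
    return "DECLARE " + var + " NVARCHAR(MAX) = '" + escaped + "';"

inner = "INSERT INTO MyBQTable1(SomeBQCol1, SomeBQCol2) VALUES(1,'AAA')"
print(embed_in_tsql_literal(inner))
# DECLARE @MyQuery NVARCHAR(MAX) = 'INSERT INTO MyBQTable1(SomeBQCol1, SomeBQCol2) VALUES(1,''AAA'')';
```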
INSERT single row
Inserts a single row using ZappySys query syntax. You can optionally append `WITH(DatasetId='...', ProjectId='...', Output='*')`.
DECLARE @MyQuery NVARCHAR(MAX) = 'INSERT INTO MyBQTable1(SomeBQCol1, SomeBQCol2) VALUES(1,''AAA'')
--WITH(DatasetId=''TestDataset'',Output=''*'')
--WITH(DatasetId=''TestDataset'',ProjectId=''MyProjectId'',Output=''*'')';
EXEC (@MyQuery) AT [LS_TO_GOOGLE_BIGQUERY_IN_GATEWAY];
INSERT multiple rows from SQL Server
Bulk insert into a BigQuery table from Microsoft SQL Server. The column list is taken from the source query; use SQL aliases so that column names match the target table.
DECLARE @MyQuery NVARCHAR(MAX) = 'INSERT INTO MyBQTable1
SOURCE(
''MSSQL''
, ''Data Source=localhost;Initial Catalog=tempdb;Integrated Security=true''
, ''SELECT Col1 AS SomeBQCol1, Col2 AS SomeBQCol2 FROM SomeTable WHERE SomeCol=123''
)
--WITH(DatasetId=''TestDataset'',Output=''*'')
--WITH(DatasetId=''TestDataset'',ProjectId=''MyProjectId'',Output=''*'')';
EXEC (@MyQuery) AT [LS_TO_GOOGLE_BIGQUERY_IN_GATEWAY];
INSERT multiple rows from any ODBC source (DSN)
Bulk insert into a BigQuery table from any ODBC source (DSN). The column list comes from the source query; use aliases so that names match the target table.
DECLARE @MyQuery NVARCHAR(MAX) = 'INSERT INTO MyBQTable1
SOURCE(
''ODBC''
, ''DSN=MyDsn''
, ''SELECT Col1 AS SomeBQCol1, Col2 AS SomeBQCol2 FROM SomeTable WHERE SomeCol=123''
)
WITH(DatasetId=''TestDataset'')';
EXEC (@MyQuery) AT [LS_TO_GOOGLE_BIGQUERY_IN_GATEWAY];
INSERT multiple rows from JSON files / API
Bulk insert into a BigQuery table from JSON (URL or file). The column list comes from the source query; use aliases to match the target table. The example uses an ODBC connection to a JSON source (e.g. the ZappySys JSON ODBC driver); the same approach works with CSV or XML sources.
DECLARE @MyQuery NVARCHAR(MAX) = 'INSERT INTO MyBQTable1
SOURCE(
''ODBC''
, ''Driver={ZappySys JSON Driver};Src=''''https://some-url/get-data''''''
, ''SELECT Col1 AS SomeBQCol1, Col2 AS SomeBQCol2 FROM _root_''
)
--WITH(DatasetId=''TestDataset'',Output=''*'')
--WITH(DatasetId=''TestDataset'',ProjectId=''MyProjectId'',Output=''*'')';
EXEC (@MyQuery) AT [LS_TO_GOOGLE_BIGQUERY_IN_GATEWAY];
The post_[dynamic_endpoint_name] endpoint belongs to the [Dynamic Table] table(s) and can therefore be used via those table(s).